Programming with a Differentiable Forth Interpreter
Authors
Abstract
There are families of neural networks that can learn to compute any function, provided sufficient training data. However, given that in practice training data is scarce for all but a small set of problems, a core question is how to incorporate prior knowledge into a model. Here we consider the case of prior procedural knowledge, such as knowing the overall recursive structure of a sequence-transduction program, or the fact that a program will likely use arithmetic operations on real numbers to solve a task. To this end, we present a differentiable interpreter for the programming language Forth. Through a neural implementation of the dual stack machine that underlies Forth, programmers can write program sketches with slots that can be filled with behaviour trained from program input-output data. As the program interpreter is end-to-end differentiable, we can optimise this behaviour directly through gradient descent techniques on user-specified objectives, and also integrate the program into any larger neural computation graph. We show empirically that our interpreter is able to effectively leverage different levels of prior program structure and learn complex transduction tasks such as sequence sorting or addition with substantially less data and better generalisation over problem sizes. In addition, we introduce neural program optimisations based on symbolic computation and parallel branching that lead to significant speed improvements.
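The key device is that discrete stack reads and writes are replaced by soft, weighted ones, so gradients can flow through the whole interpreter. The NumPy sketch below is only a hedged illustration of that idea, not the paper's actual model; the names SoftStack, push, and pop are hypothetical.

    import numpy as np

    class SoftStack:
        # Illustrative soft stack: the pointer is a distribution over cells and
        # push/pop are weighted writes/reads, so every operation is differentiable.
        def __init__(self, depth, width):
            self.memory = np.zeros((depth, width))   # one value vector per cell
            self.pointer = np.zeros(depth)           # soft "top of stack" pointer
            self.pointer[0] = 1.0

        def push(self, value):
            # blend `value` into the cells selected by the pointer, then shift up
            self.memory = (1.0 - self.pointer)[:, None] * self.memory \
                          + np.outer(self.pointer, value)
            self.pointer = np.roll(self.pointer, 1)

        def pop(self):
            # shift the pointer down, then return the expected value beneath it
            self.pointer = np.roll(self.pointer, -1)
            return self.pointer @ self.memory

    stack = SoftStack(depth=8, width=4)
    stack.push(np.array([0.0, 1.0, 0.0, 0.0]))   # push a one-hot "value"
    print(stack.pop())                            # recovers the same vector

With a hard one-hot pointer this behaves exactly like an ordinary stack; with a smeared pointer the same code gives a continuous relaxation that gradient descent can train through.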
Similar resources
Purrr (Tom Schouten, January 30, 2008)
Purrr is a Forth dialect specifically tailored to flash-ROM-based microcontrollers. A Forth typically enables direct low-level machine access in a resource-friendly way while providing a solid base for constructing high-level abstractions. Purrr includes a purely functional compositional macro language for meta-programming. The Purrr implementation consists of an optimizing cross-compiler and a...
Regularity Conditions for Non-Differentiable Infinite Programming Problems using Michel-Penot Subdifferential
In this paper we study optimization problems with infinitely many inequality constraints on a Banach space, where the objective function and the binding constraints are locally Lipschitz. Necessary optimality conditions and regularity conditions are given. Our approach is based on the Michel-Penot subdifferential.
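For context, the Michel-Penot subdifferential of a locally Lipschitz function f on a Banach space X is standardly defined through the Michel-Penot directional derivative; the statement below is included for reference and its notation may differ from the paper's:

    f^{\diamond}(x; d) = \sup_{z \in X} \ \limsup_{t \downarrow 0} \ \frac{f(x + t z + t d) - f(x + t z)}{t}

    \partial_{MP} f(x) = \{ x^{*} \in X^{*} : \langle x^{*}, d \rangle \le f^{\diamond}(x; d) \ \text{for all } d \in X \}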
Differentiable Functional Program Interpreters
Programming by Example (PBE) is the task of inducing computer programs from input-output examples. It can be seen as a type of machine learning where the hypothesis space is the set of legal programs in some programming language. Recent work on differentiable interpreters relaxes the discrete space of programs into a continuous space so that search over programs can be performed using gradient-...
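One common device for this kind of relaxation is to replace a hard choice among candidate operations with a softmax-weighted mixture of their outputs. The Python sketch below is a hedged illustration of that general idea only, not this paper's model; the names soft_step and operations are hypothetical:

    import numpy as np

    def softmax(logits):
        e = np.exp(logits - np.max(logits))
        return e / e.sum()

    # candidate operations the "program" may choose between
    operations = [lambda x: x + 1, lambda x: x * 2, lambda x: -x]

    def soft_step(x, logits):
        # blend the operations' outputs by learned weights instead of picking one;
        # the result is differentiable with respect to `logits`
        weights = softmax(logits)
        return sum(w * op(x) for w, op in zip(weights, operations))

    print(soft_step(3.0, np.array([2.0, 0.0, -1.0])))  # mostly behaves like x + 1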
Real Time Data Acquisition and Control
This paper presents a laboratory environment for the development of real-time data acquisition and control on the IBM-PC platform. The laboratory station involves the integration of low-cost computer technology with powerful software components which empower the student to efficiently and effectively construct real-time systems. The software base integrates an editor, a spreadsheet, and a real ...
Summary - TerpreT: A Probabilistic Programming Language for Program Induction
We study machine learning formulations of inductive program synthesis; that is, given input-output examples, synthesize source code that maps inputs to corresponding outputs. Our key contribution is TERPRET, a domain-specific language for expressing program synthesis problems. A TERPRET model is composed of a specification of a program representation and an interpreter that describes how progra...
Publication date: 2017